Nonlinear least squares and Sobolev gradients
Author
Abstract
Least squares methods are effective for solving systems of partial differential equations. In the case of nonlinear systems the equations are usually linearized by a Newton iteration or successive substitution method, and then treated as a linear least squares problem. We show that it is often advantageous to form a sum of squared residuals first, and then compute a zero of the gradient with a Newton-like method. We present an effective method, based on Sobolev gradients, for treating the nonlinear least squares problem directly. The method is based on trust-region subproblems defined by a Sobolev norm and solved by a preconditioned conjugate gradient method with an effective preconditioner that arises naturally from the Sobolev space setting. The trust-region method is shown to be equivalent to a Levenberg-Marquardt method which blends a Newton or Gauss-Newton iteration with a gradient descent iteration, but uses a Sobolev gradient in place of the Euclidean gradient. We also provide an introduction to the Sobolev gradient method and discuss its relationship to operator preconditioning with equivalent operators.

Keywords: Gauss-Newton; least squares; Levenberg-Marquardt; operator preconditioning; Sobolev gradient; trust region
Similar resources
High-order Sobolev preconditioning
This paper compares the use of first- and second-order Sobolev gradients to solve differential equations using the method of least-squares steepest descent. The use of high-order Sobolev gradients offers a very effective preconditioning strategy for the linear part of a nonlinear differential equation. © 2005 Elsevier Ltd. All rights reserved.
A Sobolev Gradient Method for Treating the Steady-state Incompressible Navier-Stokes Equations
The velocity-vorticity-pressure formulation of the steady-state incompressible Navier-Stokes equations in two dimensions is cast as a nonlinear least squares problem in which the functional is a weighted sum of squared residuals. A finite element discretization of the functional is minimized by a trust-region method in which the trust-region radius is defined by a Sobolev norm and the trust-reg...
Superlinearly convergent exact penalty projected structured Hessian updating schemes for constrained nonlinear least squares: asymptotic analysis
We present a structured algorithm for solving constrained nonlinear least squares problems, and establish its local two-step Q-superlinear convergence. The approach is based on an adaptive structured scheme due to Mahdavi-Amiri and Bartels of the exact penalty method of Coleman and Conn for nonlinearly constrained optimization problems. The structured adaptation also makes use of the ideas of N...
Sobolev gradients: a nonlinear equivalent operator theory in preconditioned numerical methods for elliptic PDEs
Solution methods for nonlinear boundary value problems form one of the most important topics in applied mathematics and, similarly to linear equations, preconditioned iterative methods are the most efficient tools to solve such problems. For linear equations, the theory of equivalent operators in Hilbert space has proved an efficient organized framework for the study of preconditioners [6, 9], ...
Regularization of Wavelet Approximations
In this paper, we introduce nonlinear regularized wavelet estimators for estimating nonparametric regression functions when sampling points are not uniformly spaced. The approach applies readily to many other statistical contexts. Various new penalty functions are proposed. The hard-thresholding and soft-thresholding estimators of Donoho and Johnstone are specific members of nonlinear regular...